Stochastic approximation method using diagonal positive-definite matrices for convex optimization with fixed point constraints

Authors

Abstract

This paper proposes a stochastic approximation method for solving a convex optimization problem over the fixed point set of a quasinonexpansive mapping. The proposed method is based on existing adaptive learning rate algorithms that use certain diagonal positive-definite matrices for training deep neural networks. Convergence analyses are given under specific assumptions; in particular, the results show that any accumulation point of the sequence generated by the proposed method with diminishing step-sizes almost surely belongs to the solution set of the problem in deep learning. Additionally, we apply the proposed methods to classifier ensemble problems and conduct a numerical performance comparison showing that they achieve high accuracies faster than the existing methods.
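The abstract does not state the update rule itself, but the pattern it points to (an Adam/AdaGrad-style step scaled by a diagonal positive-definite matrix, composed with a quasinonexpansive mapping whose fixed point set encodes the constraint) can be sketched as follows. This is a minimal illustration, not the paper's algorithm; the oracle grad_oracle, the mapping T, and all hyperparameter names are assumptions.

```python
import numpy as np

def adaptive_fixed_point_sa(grad_oracle, T, x0, steps=1000,
                            alpha=1e-2, beta=0.9, gamma=0.999, eps=1e-8):
    """Hedged sketch: Adam-style diagonal scaling combined with a
    quasinonexpansive mapping T whose fixed point set encodes the constraint.
    grad_oracle(x) is assumed to return a stochastic (sub)gradient estimate at x."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # first-moment estimate
    v = np.zeros_like(x)   # second-moment estimate (defines the diagonal PD matrix)
    for n in range(1, steps + 1):
        g = grad_oracle(x)
        m = beta * m + (1.0 - beta) * g
        v = gamma * v + (1.0 - gamma) * g * g
        step = alpha / np.sqrt(n)            # diminishing step-size
        d = m / (np.sqrt(v) + eps)           # diagonally scaled search direction
        x = T(x - step * d)                  # gradient-type step, then apply T
    return x
```

For instance, T could be the metric projection onto a closed ball, which is nonexpansive and hence quasinonexpansive, with the ball as its fixed point set.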


Similar articles

Proximal point algorithms for nonsmooth convex optimization with fixed point constraints

The problem of minimizing the sum of nonsmooth, convex objective functions defined on a real Hilbert space over the intersection of fixed point sets of nonexpansive mappings, onto which the projections cannot be efficiently computed, is considered. The use of proximal point algorithms that use the proximity operators of the objective functions and incremental optimization techniques is proposed...
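A minimal sketch of the incremental proximal pattern this abstract describes, under assumed interfaces (it is not the paper's scheme): proxes[i](x, lam) returns the proximity operator of the i-th objective, mappings[j] is a nonexpansive mapping whose fixed points encode the j-th constraint, and the diminishing parameter lam0 / n is an illustrative choice.

```python
import numpy as np

def incremental_proximal_fixed_point(proxes, mappings, x0, iters=500, lam0=1.0):
    """Hedged sketch: cycle through the proximity operators of the nonsmooth
    convex objectives, then through the nonexpansive mappings that encode the
    fixed point constraints."""
    x = np.asarray(x0, dtype=float)
    for n in range(1, iters + 1):
        lam = lam0 / n                 # diminishing proximal parameter
        for prox in proxes:            # incremental pass over the objectives
            x = prox(x, lam)
        for T in mappings:             # pass over the fixed point constraints
            x = T(x)
    return x

# Toy usage: f(x) = ||x||_1 (prox = soft-thresholding), constraint = fixed points
# of the projection onto the unit ball (a nonexpansive mapping).
soft = lambda x, lam: np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))
x_star = incremental_proximal_fixed_point([soft], [proj_ball], np.ones(5))
```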


Stochastic Successive Convex Approximation for Non-Convex Constrained Stochastic Optimization

This paper proposes a constrained stochastic successive convex approximation (CSSCA) algorithm to find a stationary point for a general non-convex stochastic optimization problem, whose objective and constraint functions are nonconvex and involve expectations over random states. The existing methods for non-convex stochastic optimization, such as the stochastic (average) gradient and stochastic...
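The surrogate-and-smooth pattern behind successive convex approximation can be illustrated roughly as below. This is a generic sketch rather than the CSSCA algorithm as specified in the paper; the oracles f_grad_sample, g_sample, g_grad_sample, the linearized constraint surrogate, and the weight schedules are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def sca_sketch(f_grad_sample, g_sample, g_grad_sample, x0, iters=200, tau=1.0):
    """Hedged sketch of stochastic successive convex approximation: recursively
    averaged samples define a convex quadratic surrogate of the objective and a
    linearized surrogate of the constraint g(x) <= 0; the surrogate problem is
    solved and the iterate is updated by a convex combination."""
    x = np.asarray(x0, dtype=float)
    f_grad_bar = np.zeros_like(x)                  # averaged objective gradient
    g_bar, g_grad_bar = 0.0, np.zeros_like(x)      # averaged constraint value/gradient
    for t in range(iters):
        rho = 1.0 / (t + 1) ** 0.6                 # surrogate-averaging weight
        gamma = 1.0 / (t + 1) ** 0.9               # iterate-smoothing weight
        f_grad_bar = (1 - rho) * f_grad_bar + rho * f_grad_sample(x)
        g_bar = (1 - rho) * g_bar + rho * g_sample(x)
        g_grad_bar = (1 - rho) * g_grad_bar + rho * g_grad_sample(x)
        surrogate = lambda y: f_grad_bar @ (y - x) + 0.5 * tau * np.sum((y - x) ** 2)
        con = {"type": "ineq", "fun": lambda y: -(g_bar + g_grad_bar @ (y - x))}
        x_hat = minimize(surrogate, x, constraints=[con]).x   # solve convex surrogate
        x = (1 - gamma) * x + gamma * x_hat                   # smoothed update
    return x
```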


Online Convex Optimization with Stochastic Constraints

This paper considers online convex optimization (OCO) with stochastic constraints, which generalizes Zinkevich’s OCO over a known simple fixed set by introducing multiple stochastic functional constraints that are i.i.d. generated at each round and are disclosed to the decision maker only after the decision is made. This formulation arises naturally when decisions are restricted by stochastic e...
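The virtual-queue (drift-plus-penalty) idea used in this line of work can be sketched generically as follows; the names loss_grads, cons_vals, cons_grads and the weights V and alpha are assumptions for illustration, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def oco_with_stochastic_constraints(loss_grads, cons_vals, cons_grads, x0,
                                    V=10.0, alpha=10.0, proj=lambda x: x):
    """Hedged sketch: a virtual queue Q tracks accumulated violation of the
    stochastic constraint and pushes the next decision toward feasibility,
    while V weights the loss gradient and alpha keeps the update stable."""
    x = np.asarray(x0, dtype=float)
    Q = 0.0                                        # virtual queue
    decisions = []
    for grad_f, g_val, grad_g in zip(loss_grads, cons_vals, cons_grads):
        direction = V * grad_f(x) + Q * grad_g(x)  # penalized descent direction
        x = proj(x - direction / (2.0 * alpha))    # proximal-style step onto the set
        Q = max(Q + g_val(x), 0.0)                 # queue update with the new decision
        decisions.append(x.copy())
    return decisions
```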


Gyrovector Spaces on the Open Convex Cone of Positive Definite Matrices

In this article we review an algebraic definition of the gyrogroup and a simplified version of the gyrovector space with two fundamental examples on the open ball of finite-dimensional Euclidean spaces, which are the Einstein and Möbius gyrovector spaces. We introduce the structure of gyrovector space and the gyroline on the open convex cone of positive definite matrices and explore its...
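On the positive definite cone mentioned here, the standard geodesic joining two matrices is the weighted geometric mean A #_t B = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}; whether this coincides exactly with the article's gyroline is left to the article, but the formula itself is standard and easy to compute. A small NumPy sketch:

```python
import numpy as np

def spd_power(M, p):
    """Fractional power of a symmetric positive definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * w ** p) @ V.T

def weighted_geometric_mean(A, B, t):
    """A #_t B = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}, the curve joining
    two positive definite matrices A (t = 0) and B (t = 1)."""
    A_half, A_half_inv = spd_power(A, 0.5), spd_power(A, -0.5)
    return A_half @ spd_power(A_half_inv @ B @ A_half_inv, t) @ A_half

# Midpoint (t = 1/2) of two positive definite matrices.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, -0.3], [-0.3, 3.0]])
M = weighted_geometric_mean(A, B, 0.5)
```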


A Riemannian low-rank method for optimization over semidefinite matrices with block-diagonal constraints

We propose a new algorithm to solve optimization problems of the form min f(X) for a smooth function f under the constraints that X is positive semidefinite and the diagonal blocks of X are small identity matrices. Such problems often arise as the result of relaxing a rank constraint (lifting). In particular, many estimation tasks involving phases, rotations, orthonormal bases or permutations fi...
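A minimal sketch of the low-rank (Burer-Monteiro style) idea behind such problems, not the paper's Riemannian algorithm: factor X = Y Y^T and keep every d-row block of Y row-orthonormal, so that the corresponding diagonal block of X is the d x d identity; grad_f, the step size, and the retraction by block-wise polar factors are illustrative assumptions.

```python
import numpy as np

def retract_blocks(Y, d):
    """Replace each d-row block of Y by its polar factor so that the block
    satisfies Y_i Y_i^T = I_d (hence the matching diagonal block of Y Y^T is I_d)."""
    Y = Y.copy()
    for i in range(0, Y.shape[0], d):
        U, _, Vt = np.linalg.svd(Y[i:i + d], full_matrices=False)
        Y[i:i + d] = U @ Vt
    return Y

def low_rank_block_sdp(grad_f, Y0, d, iters=500, step=1e-2):
    """Hedged sketch: gradient steps on the factor Y of X = Y Y^T, followed by a
    block-wise retraction that restores the identity diagonal-block constraint.
    grad_f(X) is assumed to return a symmetric Euclidean gradient of f at X."""
    Y = retract_blocks(np.asarray(Y0, dtype=float), d)
    for _ in range(iters):
        G = 2.0 * grad_f(Y @ Y.T) @ Y        # gradient of f(Y Y^T) with respect to Y
        Y = retract_blocks(Y - step * G, d)
    return Y @ Y.T
```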



Journal

Journal title: Fixed Point Theory and Algorithms for Sciences and Engineering

Year: 2021

ISSN: 2730-5422

DOI: https://doi.org/10.1186/s13663-021-00695-3